17 research outputs found

    Supervised Hashing with End-to-End Binary Deep Neural Network

    Image hashing is a popular technique for large-scale content-based visual retrieval due to its compact and efficient binary codes. Our work proposes a new end-to-end deep network architecture for supervised hashing that directly learns binary codes from input images while maintaining desirable properties of the binary codes such as similarity preservation, independence, and balance. Furthermore, we propose a new learning scheme that can cope with the binary-constrained loss function. The proposed algorithm is not only scalable for learning over large-scale datasets but also outperforms state-of-the-art supervised hashing methods, as illustrated by extensive experiments on various image retrieval benchmarks. Comment: Accepted to IEEE ICIP 201
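The desired code properties named above (similarity preservation, independence, and balance) can be expressed as penalty terms over relaxed, real-valued codes. The following is a minimal sketch under that interpretation, not the paper's actual objective; the exact form and weighting of each term are illustrative assumptions.

```python
import numpy as np

def hashing_loss(codes, S, alpha=1.0, beta=1.0):
    """Toy loss over relaxed (real-valued) hash codes.

    codes: (n, k) relaxed codes, ideally close to {-1, +1}
    S:     (n, n) similarity matrix, +1 for similar pairs, -1 otherwise
    Three hypothetical penalty terms:
      - similarity preservation: scaled code inner products should match S
      - independence: code dimensions should be decorrelated
      - balance: each bit should split roughly half +1 / half -1
    """
    n, k = codes.shape
    sim = np.sum((codes @ codes.T / k - S) ** 2) / n ** 2
    indep = np.sum((codes.T @ codes / n - np.eye(k)) ** 2)
    balance = np.sum(codes.mean(axis=0) ** 2)
    return sim + alpha * indep + beta * balance
```

Codes that reproduce the similarity matrix incur a lower loss than codes that contradict it, which is the behaviour a training scheme for the binary-constrained objective would exploit.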

    Selective Deep Convolutional Features for Image Retrieval

    Convolutional Neural Network (CNN) is a very powerful approach to extract discriminative local descriptors for effective image search. Recent work adopts fine-tuned strategies to further improve the discriminative power of the descriptors. Taking a different approach, in this paper, we propose a novel framework to achieve competitive retrieval performance. Firstly, we propose various masking schemes, namely SIFT-mask, SUM-mask, and MAX-mask, to select a representative subset of local convolutional features and remove a large number of redundant features. We demonstrate that this can effectively address the burstiness issue and improve retrieval accuracy. Secondly, we propose to employ recent embedding and aggregating methods to further enhance feature discriminability. Extensive experiments demonstrate that our proposed framework achieves state-of-the-art retrieval accuracy.Comment: Accepted to ACM MM 201
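As an illustration of how such masks might select convolutional features, here is a small sketch of a MAX-mask and a SUM-mask over a feature map; the paper's exact definitions may differ, and the `keep` ratio is an invented parameter.

```python
import numpy as np

def max_mask(fmap):
    """MAX-mask sketch: keep the spatial locations that attain the maximum
    activation of at least one channel (one interpretation of MAX-mask).

    fmap: (C, H, W) convolutional feature map
    returns: boolean mask of shape (H, W)
    """
    C, H, W = fmap.shape
    flat = fmap.reshape(C, -1)
    mask = np.zeros(H * W, dtype=bool)
    mask[np.argmax(flat, axis=1)] = True   # one winning location per channel
    return mask.reshape(H, W)

def sum_mask(fmap, keep=0.5):
    """SUM-mask sketch: keep locations with the highest channel-summed
    activation; `keep` is the fraction of locations retained."""
    s = fmap.sum(axis=0)
    thresh = np.quantile(s, 1 - keep)
    return s >= thresh
```

Local descriptors at masked-out locations would simply be dropped before embedding and aggregation, reducing both burstiness and descriptor count.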

    From Selective Deep Convolutional Features to Compact Binary Representations for Image Retrieval

    In the large-scale image retrieval task, the two most important requirements are the discriminability of image representations and the efficiency of their computation and storage. Regarding the former, the Convolutional Neural Network has proven to be a very powerful tool for extracting highly discriminative local descriptors for effective image search. To further improve the discriminative power of the descriptors, recent works adopt fine-tuning strategies. In this article, taking a different approach, we propose a novel, computationally efficient, and competitive framework. Specifically, we first propose various strategies to compute masks, namely SIFT-mask, SUM-mask, and MAX-mask, to select a representative subset of local convolutional features and eliminate redundant features. Our in-depth analyses demonstrate that the proposed masking schemes effectively address the burstiness drawback and improve retrieval accuracy. Second, we propose to employ recent embedding and aggregating methods that can significantly boost feature discriminability. Regarding computation and storage efficiency, we include a hashing module to produce very compact binary image representations. Extensive experiments on six image retrieval benchmarks demonstrate that our proposed framework achieves state-of-the-art retrieval performance.
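A minimal sketch of the kind of hashing module described, assuming a simple zero-centering-plus-sign binarization (the article's actual module may differ), together with Hamming-distance ranking for retrieval:

```python
import numpy as np

def binarize(features):
    """Hashing-module sketch: zero-center each dimension of the aggregated
    image descriptors, then threshold at zero to obtain compact binary codes.

    features: (n, d) real-valued image representations
    returns:  (n, d) codes with entries in {0, 1}
    """
    centered = features - features.mean(axis=0)
    return (centered > 0).astype(np.uint8)

def hamming_search(query_code, db_codes):
    """Rank database items by Hamming distance to the query code."""
    dists = np.count_nonzero(db_codes != query_code, axis=1)
    return np.argsort(dists), dists
```

Binary codes make storage compact (one bit per dimension) and let the distance computation reduce to bit comparisons, which is the efficiency argument behind the hashing module.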

    Binary Constrained Deep Hashing Network for Image Retrieval Without Manual Annotation

    Learning compact binary codes for the image retrieval task using deep neural networks has attracted increasing attention recently. However, training deep hashing networks for the task is challenging due to the binary constraints on the hash codes, the similarity-preserving property, and the requirement for a vast amount of labelled images. To the best of our knowledge, none of the existing methods has tackled all of these challenges completely in a unified framework. In this work, we propose a novel end-to-end deep learning approach for the task, in which the network is trained to produce binary codes directly from image pixels without the need for manual annotation. In particular, to deal with the non-smoothness of binary constraints, we propose a novel pairwise constrained loss function, which simultaneously encodes the distances between pairs of hash codes and the binary quantization error. In order to train the network with the proposed loss function, we propose an efficient parameter learning algorithm. In addition, to provide similar/dissimilar training images for the network, we exploit 3D models reconstructed from unlabelled images to automatically generate enormous numbers of training image pairs. Extensive experiments on image retrieval benchmark datasets demonstrate the improvements of the proposed method over state-of-the-art compact representation methods on the image retrieval problem. Comment: Accepted to WACV 201
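A pairwise loss that couples code distances with a binary quantization penalty can be sketched as follows; this contrastive-style formulation is an illustrative stand-in for the paper's actual loss, and `margin` and `lam` are invented parameters.

```python
import numpy as np

def pairwise_binary_loss(f_i, f_j, similar, margin=2.0, lam=0.1):
    """Sketch of a pairwise loss over relaxed codes f_i, f_j (real-valued,
    shape (k,)): pull similar pairs together, push dissimilar pairs apart
    up to a margin, and penalize deviation from the binary set {-1, +1}.
    """
    d2 = np.sum((f_i - f_j) ** 2)
    if similar:
        pair = d2                                      # similar: small distance
    else:
        pair = max(0.0, margin - np.sqrt(d2)) ** 2     # dissimilar: keep apart
    # quantization error: distance of each code entry from {-1, +1}
    quant = np.sum((np.abs(f_i) - 1) ** 2) + np.sum((np.abs(f_j) - 1) ** 2)
    return pair + lam * quant
```

The quantization term is what lets training proceed with smooth, real-valued network outputs while still driving them toward valid binary codes.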

    The global response: How cities and provinces around the globe tackled Covid-19 outbreaks in 2021

    Background: Tackling the spread of COVID-19 remains a crucial part of ending the pandemic. Its highly contagious nature and constant evolution, coupled with a relative lack of immunity, make the virus difficult to control. Various strategies have been proposed and adopted, including limiting contact, social isolation, vaccination, and contact tracing. However, given the heterogeneity in the enforcement of these strategies and constant fluctuations in their strictness, it is challenging to assess their true impact in controlling the spread of COVID-19.
    Methods: In the present study, we evaluated the transmission control measures imposed in 10 global urban cities and provinces in 2021: Bangkok, Gauteng, Ho Chi Minh City, Jakarta, London, Manila City, New Delhi, New York City, Singapore, and Tokyo.
    Findings: Based on our analysis, we propose a population-level Swiss cheese model for the failures and pitfalls in the strategies of each of these cities and provinces. Furthermore, whilst all the evaluated cities and provinces took a different, personalized approach to managing the pandemic, what remained common was dynamic enforcement and monitoring of breaches of each barrier of protection. The measures taken to reinforce the barriers were adjusted continuously based on the evolving epidemiological situation.
    Interpretation: How an individual city or province handled the pandemic profoundly affected and determined how the entire country handled the pandemic, since the chain of transmission needs to be broken at the grassroots level to achieve nationwide control.

    Safety and efficacy of fluoxetine on functional outcome after acute stroke (AFFINITY): a randomised, double-blind, placebo-controlled trial

    Background: Trials of fluoxetine for recovery after stroke report conflicting results. The Assessment oF FluoxetINe In sTroke recoverY (AFFINITY) trial aimed to show whether daily oral fluoxetine for 6 months after stroke improves functional outcome in an ethnically diverse population.
    Methods: AFFINITY was a randomised, parallel-group, double-blind, placebo-controlled trial done in 43 hospital stroke units in Australia (n=29), New Zealand (n=4), and Vietnam (n=10). Eligible patients were adults (aged ≥18 years) with a clinical diagnosis of acute stroke in the previous 2–15 days, brain imaging consistent with ischaemic or haemorrhagic stroke, and a persisting neurological deficit that produced a modified Rankin Scale (mRS) score of 1 or more. Patients were randomly assigned 1:1 via a web-based system using a minimisation algorithm to once-daily oral fluoxetine 20 mg capsules or matching placebo for 6 months. Patients, carers, investigators, and outcome assessors were masked to the treatment allocation. The primary outcome was functional status, measured by the mRS, at 6 months. The primary analysis was an ordinal logistic regression of the mRS at 6 months, adjusted for minimisation variables. Primary and safety analyses were done according to the patient's treatment allocation. The trial is registered with the Australian New Zealand Clinical Trials Registry, ACTRN12611000774921.
    Findings: Between Jan 11, 2013, and June 30, 2019, 1280 patients were recruited in Australia (n=532), New Zealand (n=42), and Vietnam (n=706), of whom 642 were randomly assigned to fluoxetine and 638 to placebo. Mean duration of trial treatment was 167 days (SD 48·1). At 6 months, mRS data were available for 624 (97%) patients in the fluoxetine group and 632 (99%) in the placebo group. The distribution of mRS categories was similar in the fluoxetine and placebo groups (adjusted common odds ratio 0·94, 95% CI 0·76–1·15; p=0·53). Compared with patients in the placebo group, patients in the fluoxetine group had more falls (20 [3%] vs seven [1%]; p=0·018), bone fractures (19 [3%] vs six [1%]; p=0·014), and epileptic seizures (ten [2%] vs two [<1%]; p=0·038) at 6 months.
    Interpretation: Oral fluoxetine 20 mg daily for 6 months after acute stroke did not improve functional outcome and increased the risk of falls, bone fractures, and epileptic seizures. These results do not support the use of fluoxetine to improve functional outcome after stroke.

    Simultaneous compression and quantization: A joint approach for efficient unsupervised hashing

    For unsupervised data-dependent hashing, the two most important requirements are to preserve similarity in the low-dimensional feature space and to minimize the binary quantization loss. A well-established hashing approach is Iterative Quantization (ITQ), which addresses these two requirements in separate steps. In this paper, we revisit the ITQ approach and propose novel formulations and algorithms for the problem. Specifically, we propose a novel approach, named Simultaneous Compression and Quantization (SCQ), to jointly learn to compress (reduce dimensionality) and binarize input data in a single formulation under a strict orthogonal constraint. With this approach, we introduce a loss function and its relaxed version, termed Orthonormal Encoder (OnE) and Orthogonal Encoder (OgE) respectively, which involve challenging binary and orthogonal constraints. We propose to attack the optimization using novel algorithms based on recent advances in cyclic coordinate descent. Comprehensive experiments on unsupervised image retrieval demonstrate that our proposed methods consistently outperform other state-of-the-art hashing methods. Notably, our proposed methods outperform recent deep neural network and GAN-based hashing methods in accuracy, while being very computationally efficient.
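For context, the ITQ baseline that SCQ revisits alternates between binarizing and solving an orthogonal Procrustes problem for the rotation. A sketch of that ITQ-style alternation (not of SCQ's OnE/OgE algorithms), assuming the data has already been zero-centered and reduced to k dimensions:

```python
import numpy as np

def itq_like(X, k, iters=20, seed=0):
    """ITQ-style alternating minimization of ||B - XR||_F^2.

    X: (n, k) zero-centered, dimensionality-reduced data
    Alternates: fix rotation R, set B = sign(XR); fix B, solve the
    orthogonal Procrustes problem for R via SVD of X^T B.
    Returns binary codes B (entries in {-1, +1}) and orthogonal R.
    """
    rng = np.random.default_rng(seed)
    R, _ = np.linalg.qr(rng.standard_normal((k, k)))  # random orthogonal init
    for _ in range(iters):
        B = np.sign(X @ R)
        B[B == 0] = 1.0
        U, _, Vt = np.linalg.svd(X.T @ B)             # Procrustes step
        R = U @ Vt
    B = np.sign(X @ R)                                # codes for final rotation
    B[B == 0] = 1.0
    return B, R
```

SCQ's contribution, per the abstract, is to fold the dimensionality reduction itself into this objective rather than performing it as a separate preprocessing step.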

    The complete chloroplast genome of Mimosa pigra L. (Fabaceae), a notorious invasive plant

    Mimosa pigra L., also called the giant sensitive tree, is native to tropical America and has invaded Africa, Asia, and Australia. Here, we report the complete chloroplast genome of M. pigra, which was 165,996 bp in length and composed of a large single-copy region (LSC; 93,299 bp), a small single-copy region (SSC; 17,989 bp), and two inverted repeat regions (IRs; 27,354 bp each). The complete M. pigra chloroplast genome included 83 protein-coding genes, 37 tRNAs, and 8 rRNAs. Phylogenetic analysis using the maximum likelihood method revealed the monophyly of M. pigra and related taxa of the subfamily Caesalpinioideae. In comparison, the members of Papilionoideae were paraphyletic.
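The reported partition is internally consistent: the large single-copy region, the small single-copy region, and the two inverted repeats sum to the stated total, as a one-line check confirms.

```python
# Consistency check of the reported chloroplast genome partition:
# LSC + SSC + 2 * IR should equal the total genome length.
lsc, ssc, ir = 93_299, 17_989, 27_354
total = lsc + ssc + 2 * ir
print(total)  # 165996, matching the reported 165,996 bp
```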